Theory of Fisher linear discriminant analysis and its application
A study on personal credit scoring using linear discriminant analysis
A new two-dimensional linear discriminant analysis algorithm based on fuzzy set theory
In this paper, we focus on the two-class discrimination problem and chiefly study two types of linear discriminant analysis: the principal component classifier (PCC) and Fisher linear discriminant analysis (FLDA).
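As a minimal illustrative sketch of the two-class case (not the paper's own PCC/FLDA implementation), the Fisher discriminant direction can be obtained from the within-class scatter matrix and the two class means; the data and names below are placeholders:

```python
import numpy as np

def fisher_direction(X1, X2):
    """Two-class Fisher discriminant: w is proportional to Sw^-1 (m1 - m2)."""
    m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
    # Within-class scatter: sum of the two per-class scatter matrices
    S1 = (X1 - m1).T @ (X1 - m1)
    S2 = (X2 - m2).T @ (X2 - m2)
    Sw = S1 + S2
    w = np.linalg.solve(Sw, m1 - m2)      # discriminant direction
    return w / np.linalg.norm(w)

# Toy data: two Gaussian classes in 2-D
rng = np.random.default_rng(0)
X1 = rng.normal([0, 0], 1.0, size=(100, 2))
X2 = rng.normal([3, 1], 1.0, size=(100, 2))
w = fisher_direction(X1, X2)
# Classify by thresholding the 1-D projection x @ w at the midpoint of the projected means
threshold = (X1.mean(axis=0) + X2.mean(axis=0)) @ w / 2
```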
Linear projection analysis, including principal component analysis (also known as the K-L transform) and Fisher linear discriminant analysis, is the classical and most widely used technique for feature extraction.
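Below is a small, self-contained sketch of the K-L transform (PCA) as a linear projection for feature extraction, assuming plain NumPy; it only illustrates the projection idea, not any particular paper's procedure:

```python
import numpy as np

def kl_transform(X, n_components):
    """PCA / K-L transform: project data onto the leading eigenvectors
    of the sample covariance matrix."""
    mean = X.mean(axis=0)
    Xc = X - mean
    cov = Xc.T @ Xc / (len(X) - 1)
    eigvals, eigvecs = np.linalg.eigh(cov)            # eigenvalues in ascending order
    order = np.argsort(eigvals)[::-1][:n_components]  # pick the largest ones
    W = eigvecs[:, order]                             # projection matrix
    return Xc @ W, W, mean

X = np.random.rand(100, 10)
Z, W, mean = kl_transform(X, n_components=3)          # Z: (100, 3) projected features
```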
A face recognition algorithm based on Fisher linear discriminant analysis is studied in detail; it combines the principal component analysis (PCA) based eigenface method with a linear discriminant analysis (LDA) based classifier.
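A rough sketch of such an eigenface + LDA pipeline is shown below, assuming scikit-learn; the `faces` and `labels` arrays are random stand-ins for a real labelled face dataset:

```python
# Sketch of a PCA (eigenface) + LDA face-recognition pipeline of the kind
# described above, assuming scikit-learn; data here are random placeholders.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split

faces = np.random.rand(200, 64 * 64)     # 200 flattened 64x64 "face" images
labels = np.repeat(np.arange(20), 10)    # 20 subjects, 10 images each

X_train, X_test, y_train, y_test = train_test_split(
    faces, labels, stratify=labels, random_state=0)

# PCA (eigenfaces) first reduces dimensionality so the scatter matrices used
# by LDA are well conditioned; LDA then finds the discriminant subspace.
model = make_pipeline(PCA(n_components=50, whiten=True),
                      LinearDiscriminantAnalysis())
model.fit(X_train, y_train)
print("test accuracy:", model.score(X_test, y_test))
```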
The inherent relationship between Fisher linear discriminant analysis and the Karhunen-Loeve expansion is revealed: uncorrelated linear discriminant analysis (ULDA) is essentially equivalent to a classical K-L expansion method, namely the optimal compression of the discriminant information contained in the class mean vectors. Moreover, ULDA is enhanced using the idea of another K-L expansion method, and an optimal K-L expansion method is finally developed.
Based on an in-depth study of the pattern recognition system, feature extraction through a second-order polynomial fit of the descending part of the response curve made the measurement process less time-consuming. The performance of two pattern recognition algorithms, principal component analysis (PCA) and linear discriminant analysis (LDA), in practical problems was compared and the differences analyzed. An artificial neural network (ANN) trained with the back-propagation algorithm (BPA) was designed, problems in its application were identified and improvements proposed, and combining PCA/LDA with the ANN improved the identification performance of the system.
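The following sketch illustrates one plausible way to turn the descending part of a response curve into second-order polynomial coefficients used as features; the curve shape and all names are invented for illustration and are not the paper's measurement protocol:

```python
import numpy as np

def poly_features(t, response):
    """Fit a 2nd-order polynomial to the descending part of a response curve
    and use its coefficients as features (illustrative only)."""
    peak = np.argmax(response)
    t_desc, y_desc = t[peak:], response[peak:]
    coeffs = np.polyfit(t_desc, y_desc, deg=2)   # [a, b, c] of a*t^2 + b*t + c
    return coeffs

# Synthetic response: linear rise followed by exponential decay
t = np.linspace(0, 10, 200)
response = np.where(t < 2, t / 2, np.exp(-(t - 2) / 3))
features = poly_features(t, response)
```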
The conventional principal component analysis (PCA) and Fisher linear discriminant analysis (LDA) are vector based: to apply them to image recognition, the original image matrices must first be transformed into vectors of the same dimension, and the covariance matrix and projection directions are then computed from these vectors. The two methods proposed here, by contrast, require no such vectorization when extracting image features; the image scatter matrices are constructed directly from the image matrices themselves, and principal component analysis and linear discriminant analysis are then carried out on these scatter matrices.
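A minimal sketch of building an image scatter matrix directly from image matrices, in the spirit of such 2DPCA-style methods (not necessarily the authors' exact formulation), might look like this:

```python
import numpy as np

def image_scatter_matrix(images):
    """Build an image scatter matrix directly from 2-D image matrices
    (2DPCA-style), without flattening images into long vectors."""
    mean_image = images.mean(axis=0)                 # (h, w) mean image
    G = np.zeros((images.shape[2], images.shape[2])) # (w, w) scatter matrix
    for A in images:
        D = A - mean_image
        G += D.T @ D
    return G / len(images)

images = np.random.rand(50, 32, 32)                  # 50 toy 32x32 images
G = image_scatter_matrix(images)
eigvals, eigvecs = np.linalg.eigh(G)
W = eigvecs[:, -5:]                                  # top 5 projection axes
features = images @ W                                # (50, 32, 5) projected features
```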
Linear discriminant analysis (LDA) and the related Fisher's linear discriminant are methods used in statistics, pattern recognition and machine learning to find a linear combination of features which characterizes or separates two or more classes of objects or events. The resulting combination may be used as a linear classifier, or, more commonly, for dimensionality reduction before later classification.
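For concreteness, a brief usage example with scikit-learn's LinearDiscriminantAnalysis shows both roles mentioned above, dimensionality reduction and linear classification, on the standard iris data:

```python
from sklearn.datasets import load_iris
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

X, y = load_iris(return_X_y=True)
lda = LinearDiscriminantAnalysis(n_components=2)   # at most (n_classes - 1) components
lda.fit(X, y)

X_reduced = lda.transform(X)       # dimensionality reduction: 4 features -> 2
predictions = lda.predict(X)       # LDA used directly as a linear classifier
print(X_reduced.shape, (predictions == y).mean())
```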